Exploring Three Recurrent Neural Network Architectures for Geomagnetic Predictions


Abstract

Three different recurrent neural network (RNN) architectures are studied for the prediction of geomagnetic activity. The RNNs studied are the Elman, gated recurrent unit (GRU), and long short-term memory (LSTM) networks. The RNNs take solar wind data as inputs to predict the Dst index, which summarizes complex geomagnetic processes into a single time series. The models are trained and tested using five-fold cross-validation on the hourly resolution OMNI dataset covering the years 1995–2015. The inputs are solar wind plasma (particle density and speed), vector magnetic fields, and time of year and time of day. The networks are regularized using early stopping and dropout. We find that both the GRU and LSTM models perform better than the Elman model; however, we see no significant difference in performance between GRU and LSTM. Networks with dropout require more weights to reach the same validation error as networks without dropout. However, the gap between training and validation error becomes smaller when dropout is applied, reducing over-fitting and improving generalization. Another advantage of dropout is that it can be applied during prediction to provide confidence limits on the predictions. The confidence limits increase with increasing Dst magnitude: a consequence of the less populated input-target space for events with large values, thereby increasing the uncertainty in the estimates. The best networks have a test set RMSE of 8.8 nT, a bias close to zero, and a linear correlation of 0.90.
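
The dropout-based confidence limits mentioned above can be sketched with Monte Carlo dropout: keep the dropout mask active at prediction time, run many stochastic forward passes, and read confidence limits off the spread of the outputs. The sketch below is a minimal illustration, not the paper's model: the fixed linear readout, the 8-dimensional input, and the dropout rate are all invented stand-ins for a trained RNN.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical stand-in for a trained network: a fixed linear readout from
# an 8-dimensional solar-wind feature vector to a Dst estimate in nT.
# (The paper's actual models are Elman/GRU/LSTM networks; the weights and
# sizes here are invented purely for illustration.)
W = rng.normal(size=(8,))

def mc_dropout_predict(x, p=0.2):
    """One stochastic forward pass: drop each feature with probability p
    and rescale the survivors by 1/(1 - p), as in inverted dropout."""
    mask = (rng.random(x.shape) >= p) / (1.0 - p)
    return float(W @ (x * mask))

x = rng.normal(size=(8,))  # one hour of (scaled) solar-wind inputs

# Repeat the stochastic pass many times and summarize the spread.
samples = np.array([mc_dropout_predict(x) for _ in range(1000)])
dst_mean = samples.mean()                             # point prediction
dst_lo, dst_hi = np.percentile(samples, [2.5, 97.5])  # ~95% limits
```

Because events with large Dst magnitude are rare in the training data, the spread of such Monte Carlo samples tends to be wider there, which is consistent with the abstract's observation that the confidence limits grow with Dst magnitude.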


Similar resources

Optimizing and Contrasting Recurrent Neural Network Architectures

Recurrent Neural Networks (RNNs) have long been recognized for their potential to model complex time series. However, it remains to be determined what optimization techniques and recurrent architectures can be used to best realize this potential. The experiments presented take a deep look into Hessian-free optimization, a powerful second-order optimization method that has shown promising results...


A comparison between recurrent neural network architectures for digital equalization

This paper shows a comparison between three different first-order recurrent neural network (RNN) architectures (fully recurrent, partially recurrent, and Elman), trained using the real-time recurrent learning (RTRL) algorithm and the GSM training sequence ratio (26/114) for digital equalization of 2-ary PAM signals. The results show no substantial effect of the particular architecture or the number...


A Recurrent Neural Network Model for Solving Linear Semidefinite Programming

In this paper we solve a wide range of Semidefinite Programming (SDP) problems by using Recurrent Neural Networks (RNNs). SDP is an important numerical tool for analysis and synthesis in systems and control theory. First, we reformulate the problem as a linear programming problem; second, we reformulate it as a first-order system of ordinary differential equations. Then a recurrent neural network...


Explaining Recurrent Neural Network Predictions in Sentiment Analysis

Recently, a technique called Layer-wise Relevance Propagation (LRP) was shown to deliver insightful explanations in the form of input space relevances for understanding feed-forward neural network classification decisions. In the present work, we extend the usage of LRP to recurrent neural networks. We propose a specific propagation rule applicable to multiplicative connections as they arise in...


New Recurrent Neural Architectures

This paper presents two new neural networks, the TASM (Temporal Associative Subject Memory) and the SelfRecurrent network, described as complex types of recurrent organisms. After a short general definition of recurrent neural networks, we introduce the theoretical structure of the new architectures. The paper shows two relevant applications on medical datasets which show the good classific...



Journal

Journal title: Frontiers in Astronomy and Space Sciences

Year: 2021

ISSN: 2296-987X

DOI: https://doi.org/10.3389/fspas.2021.664483